JAX London Blog

JAX London, 08-11 October 2018
The Conference for JAVA & Software Innovation

Arianna Aondio

Which horse should you back in the delivery architecture race? In this article, Arianna Aondio lays out the competition and helps you evaluate the pros and cons.

Two competing approaches – cloud and edge – are in a race to become the winning content delivery architecture. If you browse the web, you get the impression that edge computing is the clear favorite.

Before we discuss whether and when to choose edge over cloud (or combine the two), let’s dig deeper into “edge computing”. What is edge computing, and why is there so much hype about a term that RedMonk analyst James Governor described in a recent webinar as “nebulous”?

Edge computing is all about pushing application logic to the extremes of a network. The need for it arose with the Internet of Things (IoT). As IoT relies on many distributed devices and sensors talking to each other, data collection and processing must be handled to suit this new paradigm. Instead of transmitting lots of data from the device to the backend, the idea arose to push the logic onto the device itself, reducing latency and processing times, improving performance and allowing the device to become “smart”. The broader definition of edge computing includes any strategy that moves logic closer to the end user – within the device itself or on a server living on an architecture layer close to the client browser.

This broader definition includes the strategy of combining content delivery with edge computing and edge caching. You are likely familiar with caching: creating a temporary storage area that mirrors a site’s or application’s content. The cache serves a visitor’s request to a site or application, eliminating the need for the server to fetch the requested content from the backend. Consequently, server overload, the most common cause of slow content delivery, is removed.
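The mechanics can be sketched as a minimal cache-aside lookup with a time-to-live (TTL): a fresh copy is served from memory, and only a miss triggers a trip to the origin. This is an illustrative sketch with hypothetical names, not the API of any particular caching product:

```python
import time

class EdgeCache:
    """Minimal TTL cache: serve from memory when fresh, else fetch from origin."""

    def __init__(self, ttl_seconds=60):
        self.ttl = ttl_seconds
        self.store = {}  # url -> (content, stored_at)

    def get(self, url, fetch_from_origin):
        entry = self.store.get(url)
        if entry is not None:
            content, stored_at = entry
            if time.monotonic() - stored_at < self.ttl:
                return content, "HIT"      # served without touching the backend
        content = fetch_from_origin(url)   # cache miss: one round trip to the origin
        self.store[url] = (content, time.monotonic())
        return content, "MISS"

cache = EdgeCache(ttl_seconds=60)
origin_calls = []

def origin(url):
    origin_calls.append(url)
    return f"<html>content of {url}</html>"

body, status = cache.get("/home", origin)    # first request hits the origin
body2, status2 = cache.get("/home", origin)  # repeat request is served from cache
```

Within the TTL window the origin is contacted exactly once per URL, which is precisely the offloading effect described above.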

Let’s have a look at a simplified content delivery architecture: apart from the origin servers producing the content, you have a cloud layer responsible for the “heavy lifting”, where this content is usually cached and distributed. You might also have a fog layer to improve network performance and bandwidth usage. Finally, the edge layer is where data is collected, processed and filtered.

Depending on how flexible your cache software is, you can add caching to any of those layers. Adding caching to the edge layer reduces the response time as the request doesn’t have to travel all the way back to the cloud. This not only offloads your cloud layer but also improves the performance of the content delivery (lower response latencies), improves security (authentication and authorization) and provides a better user/customer experience. Also, having more granular control at the edge increases the flexibility of content delivery as you stay in control of the HTTP behaviour.
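Staying in control of the HTTP behaviour at the edge typically means deciding, per response, whether to cache it and for how long. A minimal sketch of such a policy, with hypothetical rules and names rather than any specific cache software’s configuration language:

```python
def cache_policy(path, content_type):
    """Return (ttl_seconds, cacheable) for a response, decided at the edge."""
    if path.startswith("/api/"):
        return 0, False        # dynamic API responses: pass straight to the backend
    if content_type.startswith("image/"):
        return 86400, True     # static assets: cache for a day at the edge
    return 120, True           # default: short TTL for HTML pages

ttl, cacheable = cache_policy("/products/42", "text/html")
```

Keeping these decisions at the edge means a rule change takes effect without touching the cloud layer or the origin.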

However, introducing edge caching also means more maintenance plus the need for scheduling and provisioning to keep the servers up and running. Therefore, before you place all your bets on the single horse called edge computing, you should evaluate if your cloud layer is actually sufficient to fulfill your content delivery needs.

The following questions help you evaluate your needs and find the right tradeoffs that are in line with your business requirements:

Do I really need to process logic on the edge or can I afford to wait until the user requests get to the cloud layer?
Background: Shuffling data back and forth is quite expensive, as bandwidth limitations are still a big issue in the content delivery world. If you already know that your network gets easily congested, it makes sense to run logic on an edge layer.

How much user personalisation does my website have?
Background: For an e-commerce site, user personalisation is mandatory. The better it is, the more the site will likely sell. However, user-personalised content requires more roundtrips and more bytes to be exchanged between the client browser and the servers with the content. Therefore, for personalised e-commerce sites, using at least some degree of edge logic or edge caching makes a lot of sense and is a good investment to support the personalisation initiatives.
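One common compromise is to vary the cache key on a coarse user segment rather than on the individual user, so personalised variants can still be cached at the edge instead of always travelling back to the origin. A sketch, assuming the segment is carried in a cookie (the cookie and segment names are hypothetical):

```python
def cache_key(url, cookies):
    """Build a cache key that varies on a coarse user segment, not the user."""
    segment = cookies.get("segment", "anonymous")  # e.g. "gold", "new", "anonymous"
    return f"{url}|{segment}"

# Two gold-tier users share one cached variant; an anonymous visitor gets another.
key_a = cache_key("/offers", {"segment": "gold", "user_id": "1001"})
key_b = cache_key("/offers", {"segment": "gold", "user_id": "2002"})
key_c = cache_key("/offers", {})
```

The number of cached variants grows with the number of segments, not the number of users, which keeps the hit rate high while still serving personalised content.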

Do I use a lot of streaming to deliver the content?
Background: For streaming you don’t really need much edge logic; instead, you are better off purchasing extra cloud instances very close to the audience you are streaming to.

Where is my audience located?
Background: If your users are distributed across several geographic locations, it might make sense to have many distributed edge caches to provide a low-latency experience even to the most remote consumers.
